Sparse Nonlinear Support Vector Machines via Stochastic Approximation
Authors
Abstract
Nonlinear support vector machines (SVMs) are more broadly useful than linear SVMs, as they can find more accurate predictors when the data under consideration have curved intrinsic category boundaries. Nonlinear SVM formulations based on kernels are, however, often much more expensive and data-intensive to solve than linear SVMs, and the resulting classifiers can be expensive to apply. This paper describes approaches for solving nonlinear SVMs on very large datasets. The approaches make use of low-rank approximations to the kernel (constructed using randomized methods), minimization of a reduced primal formulation using stochastic approximation, and recovery of a sparse classifier that can be applied inexpensively to new data.
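The overall recipe described above (a randomized approximation to the kernel, then stochastic approximation on a reduced primal problem) can be sketched as follows. This is an illustrative toy, not the paper's algorithm: random Fourier features stand in for the randomized low-rank kernel approximation, and a Pegasos-style stochastic subgradient loop minimizes the reduced primal hinge-loss objective. All variable names and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data with a curved intrinsic boundary: label = sign(x1^2 + x2^2 - 1).
X = rng.uniform(-2, 2, size=(1000, 2))
y = np.where((X ** 2).sum(axis=1) > 1.0, 1.0, -1.0)

# Random Fourier features approximating the RBF kernel exp(-gamma * ||x - x'||^2).
gamma, D = 1.0, 200
W = rng.normal(scale=np.sqrt(2 * gamma), size=(2, D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # n x D approximate feature map

# Pegasos-style SGD on the reduced primal: (lam/2)||w||^2 + mean hinge loss.
lam, w = 0.01, np.zeros(D)
for t in range(1, 20001):
    i = rng.integers(len(y))
    eta = 1.0 / (lam * t)                  # decaying step size
    margin = y[i] * (Z[i] @ w)
    w *= (1.0 - eta * lam)                 # gradient step on the regularizer
    if margin < 1:                         # hinge active: add its subgradient
        w += eta * y[i] * Z[i]

acc = np.mean(np.sign(Z @ w) == y)
```

Because the trained weight vector lives in the low-dimensional feature space, applying the classifier to a new point costs only one feature map and one dot product, which is the inexpensive-application property the abstract highlights.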
Similar Papers
A Robust LS-SVM Regression
In comparison to the original SVM, which involves a quadratic programming task, LS-SVM simplifies the required computation, but unfortunately the sparseness of the standard SVM is lost. Another problem is that LS-SVM is only optimal if the training samples are corrupted by Gaussian noise. In Least Squares SVM (LS-SVM), the nonlinear solution is obtained by first mapping the input vector to a high ...
Stochastic Gradient Twin Support Vector Machine for Large Scale Problems
The stochastic gradient descent algorithm has been successfully applied to support vector machines (in the PEGASOS algorithm) for many classification problems. In this paper, stochastic gradient descent is applied to twin support vector machines for classification. Compared with PEGASOS, the proposed stochastic gradient twin support vector machine (SGTSVM) is insensitive to stochastic samplin...
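The PEGASOS update referenced above is projected stochastic subgradient descent on the regularized hinge-loss objective f(w) = (lam/2)||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>). A minimal generic sketch of that update rule (not the proposed SGTSVM) follows; names and parameter values are illustrative.

```python
import numpy as np

def pegasos(X, y, lam=0.1, iters=5000, seed=0):
    """Pegasos-style training of a linear SVM (no bias term)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for t in range(1, iters + 1):
        i = rng.integers(len(y))
        eta = 1.0 / (lam * t)            # step size schedule 1/(lam * t)
        active = y[i] * (X[i] @ w) < 1   # is the hinge term active at x_i?
        w *= (1.0 - eta * lam)           # step on the regularizer gradient
        if active:
            w += eta * y[i] * X[i]       # step on the hinge subgradient
        # Optional projection onto the ball of radius 1/sqrt(lam).
        norm, r = np.linalg.norm(w), 1.0 / np.sqrt(lam)
        if norm > r:
            w *= r / norm
    return w

# Linearly separable toy data through the origin.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = np.sign(X[:, 0] + X[:, 1])
y[y == 0] = 1.0
w = pegasos(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The 1/(lam*t) step size and the optional norm projection are the two ingredients that give PEGASOS its strong-convexity convergence rate.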
Approximate Stochastic Subgradient Estimation Training for Support Vector Machines
Subgradient algorithms for training support vector machines have been quite successful for solving large-scale and online learning problems. However, they have been restricted to linear kernels and strongly convex formulations. This paper describes efficient subgradient approaches without such limitations. Our approaches make use of randomized low-dimensional approximations to nonlinear kernels,...
Load Forecasting Using Fixed-Size Least Squares Support Vector Machines
Based on the Nyström approximation and the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM), it becomes possible to apply a nonlinear model to a large scale regression problem. This is done by using a sparse approximation of the nonlinear mapping induced by the kernel matrix, with an active selection of support vectors based on quadratic Renyi entropy criteria. The meth...
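The Nyström approximation mentioned above builds a low-rank factorization of the full kernel matrix from a small set of landmark points. A minimal sketch follows; note that it selects landmarks uniformly at random, whereas the fixed-size LS-SVM method described selects them by a quadratic Renyi entropy criterion. All variable names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, gamma=0.1):
    """RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = rng.normal(size=(300, 2))
m = 50                                    # number of landmark points
idx = rng.choice(len(X), size=m, replace=False)
Knm = rbf(X, X[idx])                      # n x m cross-kernel block
Kmm = rbf(X[idx], X[idx])                 # m x m landmark kernel block

# Rank-m Nyström approximation: K ≈ Knm Kmm^+ Knm^T,
# realized as an explicit feature map Z with K ≈ Z Z^T.
vals, vecs = np.linalg.eigh(Kmm)
keep = vals > 1e-10                       # drop numerically zero eigenvalues
Z = Knm @ vecs[:, keep] / np.sqrt(vals[keep])

K = rbf(X, X)                             # full kernel, for checking only
err = np.linalg.norm(K - Z @ Z.T) / np.linalg.norm(K)
```

In practice the full matrix K is never formed; training works directly with the n x m feature matrix Z, which is what makes the approach feasible at large scale.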
Nonlinear Modelling and Support Vector Machines
Neural networks such as multilayer perceptrons and radial basis function networks have been very successful in a wide range of problems. In this paper we give a short introduction to some new developments related to support vector machines (SVM), a new class of kernel-based techniques introduced within statistical learning theory and structural risk minimization. This new approach leads to solvi...
Journal:
Volume Issue
Pages -
Published: 2010